BARTPhoBEiT: Pre-trained Sequence-to-Sequence and Image Transformers Models for Vietnamese Visual Question Answering
Visual Question Answering (VQA) is an intricate and demanding task that
integrates natural language processing (NLP) and computer vision (CV),
capturing the interest of researchers. The English language, renowned for its
wealth of resources, has witnessed notable advancements in both datasets and
models designed for VQA. However, few models target lower-resource languages
such as Vietnamese. To address this limitation, we introduce a
transformer-based Vietnamese model named BARTPhoBEiT. The model combines
pre-trained Sequence-to-Sequence and Bidirectional Encoder representations from
Image Transformers for Vietnamese, and we evaluate it on Vietnamese VQA datasets.
Experimental results demonstrate that our proposed model outperforms the strong
baseline and improves the state-of-the-art in six metrics: Accuracy, Precision,
Recall, F1-score, WUPS 0.0, and WUPS 0.9.
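WUPS (Wu-Palmer Similarity) scores soften exact-match accuracy by crediting semantically close answers, with the threshold (0.0 or 0.9) controlling how harshly low-similarity matches are down-weighted. A minimal sketch of the standard thresholded WUPS computation is below; the toy word-similarity function stands in for the WordNet-based Wu-Palmer similarity the metric normally uses, and all names and values here are illustrative assumptions, not the paper's implementation:

```python
def wups_score(pred_tokens, gold_tokens, sim, threshold):
    """Thresholded WUPS between a predicted and a gold answer.

    sim(a, b) -> [0, 1] word similarity; scores below `threshold`
    are scaled by 0.1 (the standard down-weighting).
    """
    def mu(x):
        return x if x >= threshold else 0.1 * x

    def directed(xs, ys):
        # Each token in xs is matched to its most similar token in ys.
        prod = 1.0
        for x in xs:
            prod *= mu(max(sim(x, y) for y in ys))
        return prod

    # Symmetrize by taking the worse of the two directions.
    return min(directed(pred_tokens, gold_tokens),
               directed(gold_tokens, pred_tokens))


# Toy similarity: identical words score 1.0, everything else 0.5.
toy_sim = lambda a, b: 1.0 if a == b else 0.5
```

With this toy similarity, an exact match scores 1.0 under either threshold, while a near-miss scores 0.5 under WUPS 0.0 but only 0.05 under WUPS 0.9, which is why the two thresholds are reported separately.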
ViHOS: Hate Speech Spans Detection for Vietnamese
The rise in hateful and offensive language directed at other users is one of
the adverse side effects of the increased use of social networking platforms.
This could make it difficult for human moderators to review tagged comments
filtered by classification systems. To help address this issue, we present the
ViHOS (Vietnamese Hate and Offensive Spans) dataset, the first human-annotated
corpus containing 26k spans on 11k comments. We also provide definitions of
hateful and offensive spans in Vietnamese comments as well as detailed
annotation guidelines. In addition, we conduct experiments with various
state-of-the-art models. Specifically, XLM-R achieved the best
F1-scores in Single span detection and All spans detection, while
PhoBERT obtained the highest in Multiple spans detection. Finally,
our error analysis highlights the types of spans in our data that remain
difficult to detect, pointing to directions for future research.
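Span detection is typically scored by overlap between predicted and gold character offsets rather than whole-comment labels. The abstract does not specify ViHOS's exact evaluation script, so the following is only a minimal character-level F1 sketch under that assumption, with hypothetical function and variable names:

```python
def char_f1(pred_spans, gold_spans):
    """Character-level F1 between predicted and gold spans.

    Spans are (start, end) half-open character offsets; each span set
    is flattened to the set of character positions it covers.
    """
    pred = {i for s, e in pred_spans for i in range(s, e)}
    gold = {i for s, e in gold_spans for i in range(s, e)}
    if not pred and not gold:
        return 1.0          # both empty: nothing to detect, perfect score
    tp = len(pred & gold)   # characters covered by both
    if tp == 0:
        return 0.0
    precision = tp / len(pred)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)
```

For example, predicting only the first half of a four-character gold span gives precision 1.0 and recall 0.5, hence F1 ≈ 0.67; this character-level view is what separates the Single, Multiple, and All spans settings mentioned above.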
Disclaimer: This paper contains real comments that could be considered
profane, offensive, or abusive.
Comment: EACL 202